Evolutionary acquisition of neural topologies
This article is about the evolutionary acquisition of artificial neural network topologies, not of natural ones.
Evolutionary acquisition of neural topologies (EANT/EANT2) is an evolutionary reinforcement learning method that evolves both the topology and the weights of artificial neural networks. It is closely related to the work of Angeline et al.〔Peter J. Angeline, Gregory M. Saunders, and Jordan B. Pollack. An evolutionary algorithm that constructs recurrent neural networks. IEEE Transactions on Neural Networks, 5:54–65, 1994.〕 and of Stanley and Miikkulainen.〔NeuroEvolution of Augmenting Topologies (NEAT) by Stanley and Miikkulainen, 2005.〕 Like the work of Angeline et al., the method uses a form of parametric mutation taken from evolution strategies and evolutionary programming, in which adaptive step sizes are used to optimize the weights of the neural networks; EANT2 uses CMA-ES, one of the most advanced evolution strategies, for this purpose. Like NEAT, the method starts with minimal structures that gain complexity along the evolution path.
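As a rough illustration of this kind of parametric mutation, the following is a minimal, self-contained sketch of log-normal self-adaptation of per-weight step sizes, as used in classic evolution strategies. It is not the EANT/EANT2 implementation (EANT2 replaces such a scheme with CMA-ES), and the function and variable names are chosen only for this example.
<syntaxhighlight lang="python">
import numpy as np

def self_adaptive_mutation(weights, sigmas, rng):
    """Mutate a weight vector with per-weight adaptive step sizes
    (log-normal self-adaptation, as in classic evolution strategies)."""
    n = len(weights)
    tau = 1.0 / np.sqrt(2.0 * np.sqrt(n))    # per-component learning rate
    tau_prime = 1.0 / np.sqrt(2.0 * n)       # global learning rate
    # First perturb the step sizes, then use them to perturb the weights.
    new_sigmas = sigmas * np.exp(tau_prime * rng.standard_normal()
                                 + tau * rng.standard_normal(n))
    new_weights = weights + new_sigmas * rng.standard_normal(n)
    return new_weights, new_sigmas

rng = np.random.default_rng(0)
w = np.zeros(5)                  # weights of a small network
s = np.full(5, 0.1)              # initial step sizes, adapted over generations
w, s = self_adaptive_mutation(w, s, rng)
</syntaxhighlight>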
==Contribution of EANT to neuroevolution==

Despite sharing these two properties, the method has the following important features that distinguish it from previous work in neuroevolution.
It introduces a genetic encoding, called common genetic encoding (CGE), that handles both direct and indirect encodings of neural networks within the same theoretical framework. The encoding has two important properties that make it suitable for evolving neural networks (a simplified sketch of such an encoding is given below):
# It is ''complete'' in that it is able to represent all types of valid phenotype networks.
# It is ''closed'', i.e. every valid genotype represents a valid phenotype. (Similarly, the encoding is ''closed under genetic operators'' such as structural mutation and crossover.)
These properties have been formally proven.〔Yohannes Kassahun, Mark Edgington, Jan Hendrik Metzen, Gerald Sommer, and Frank Kirchner. Common Genetic Encoding for Both Direct and Indirect Encodings of Networks. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2007), London, UK, 1029–1036, 2007.〕
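To make the idea of a linear genome that encodes a network concrete, here is a deliberately simplified sketch: the genome is a sequence of neuron, input, and jumper genes and is evaluated like a prefix expression with a stack. This is only loosely inspired by CGE; the gene fields, the evaluation order, and the treatment of jumper (reuse) genes are illustrative assumptions, not the formal encoding proved complete and closed in the paper cited above.
<syntaxhighlight lang="python">
import numpy as np
from dataclasses import dataclass
from typing import List, Union

# Simplified gene types for a linear genome (illustrative only).
@dataclass
class NeuronGene:
    neuron_id: int    # identifier, so jumper genes can refer back to it
    arity: int        # how many following sub-expressions feed this neuron
    weight: float     # weight on this neuron's output

@dataclass
class InputGene:
    input_id: int     # index into the network's input vector
    weight: float

@dataclass
class JumperGene:
    source_id: int    # reuse the output of an already-evaluated neuron
    weight: float

Gene = Union[NeuronGene, InputGene, JumperGene]

def evaluate(genome: List[Gene], x: np.ndarray) -> float:
    """Decode and evaluate the genome as a prefix expression, reading
    right to left with a stack (a simplification of CGE's evaluation)."""
    stack: List[float] = []
    outputs = {}                              # neuron_id -> activation
    for gene in reversed(genome):
        if isinstance(gene, InputGene):
            stack.append(gene.weight * x[gene.input_id])
        elif isinstance(gene, JumperGene):
            # Unknown sources (e.g. recurrent links on a first pass) read 0.
            stack.append(gene.weight * outputs.get(gene.source_id, 0.0))
        else:                                 # NeuronGene
            total = sum(stack.pop() for _ in range(gene.arity))
            activation = np.tanh(total)
            outputs[gene.neuron_id] = activation
            stack.append(gene.weight * activation)
    return stack.pop()                        # output of the root neuron

# A minimal genome: one output neuron fed by two weighted inputs.
genome = [NeuronGene(0, arity=2, weight=1.0),
          InputGene(0, weight=0.5),
          InputGene(1, weight=-0.3)]
print(evaluate(genome, np.array([1.0, 2.0])))
</syntaxhighlight>
The appeal of such a linear encoding is that structural mutations amount to inserting genes into the sequence, while the connection weights live directly on the genes and can be optimized in place.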
For evolving the structure and weights of neural networks, an evolutionary process is used in which the ''exploration'' of structures takes place at a larger timescale (structural exploration) and the ''exploitation'' of existing structures takes place at a smaller timescale (structural exploitation). In the structural exploration phase, new neural structures are developed by gradually adding new neurons and connections to an initially minimal network that serves as the starting point. In the structural exploitation phase, the weights of the currently available structures are optimized using an evolution strategy.
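The two timescales can be sketched as a nested loop. The following is a minimal illustration under simplifying assumptions: the structural operators and genome accessors (make_initial_genome, add_random_structure, get_weights, set_weights, fitness) are hypothetical hooks the caller would supply, and the inner loop uses a simple (1+1) evolution strategy with a success-based step-size rule in place of the full CMA-ES used by EANT2.
<syntaxhighlight lang="python">
import numpy as np

def eant_style_search(make_initial_genome, add_random_structure,
                      get_weights, set_weights, fitness,
                      generations=50, exploit_iters=200, seed=0):
    """Outer loop: structural exploration (rarely, add new structure).
    Inner loop: structural exploitation (often, tune existing weights).
    set_weights is assumed to return the genome with the new weights."""
    rng = np.random.default_rng(seed)
    genome = make_initial_genome()            # minimal starting network
    best_fit = fitness(genome)
    for _ in range(generations):
        # Structural exploration: propose a slightly larger structure.
        candidate = add_random_structure(genome, rng)
        # Structural exploitation: optimize the candidate's weights with
        # a simple (1+1)-ES (EANT2 would use CMA-ES here).
        w = np.asarray(get_weights(candidate), dtype=float)
        cand_fit = fitness(set_weights(candidate, w))
        sigma = 0.1
        for _ in range(exploit_iters):
            trial = w + sigma * rng.standard_normal(w.shape)
            trial_fit = fitness(set_weights(candidate, trial))
            if trial_fit >= cand_fit:
                w, cand_fit = trial, trial_fit
                sigma *= 1.1                  # success: widen the search
            else:
                sigma *= 0.9                  # failure: narrow the search
        candidate = set_weights(candidate, w)
        if cand_fit > best_fit:               # keep the new structure only if it helps
            genome, best_fit = candidate, cand_fit
    return genome, best_fit
</syntaxhighlight>
The sketch preserves the separation described above: structures change only once per outer generation, while the weights of the current structure are adjusted many times in the inner loop.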
